"Facebook knows its systems lead teenagers to anorexia-related content.
The company had to "break the glass" and turn back on safety settings after the 6 January Washington riots.
Facebook intentionally targets teenagers and children under 13.
Monday's outage that brought down Facebook, Instagram and WhatsApp meant that for more than five hours Facebook could not "destabilise democracies"."
"Joel and Anna have experienced this too, though Joel believes his tech is not inherently misogynistic. "Because I set it up, I know exactly the phrase that needs to be used and Anna doesn't," he explains. "She'll say it slightly wrong, then I say it and to her ear it sounds like I'm saying exactly the same thing in a calmer voice.""
"BEIJING: Handing over a piping hot meal at exactly the time promised, Chinese food delivery driver Zhuang Zhenhua triumphantly tapped his job as complete through the Meituan app -- and was immediately fined half of his earnings.
A glitch meant it inaccurately registered him as being late and he incurred an automatic penalty -- one of many ways, he said, delivery firms exploit millions of workers even as the sector booms.
Authorities have launched a crackdown demanding that firms including Meituan and Alibaba's Ele.me ensure basic labour protections such as proper compensation and insurance, and tackle algorithms that effectively encourage dangerous driving."
"The collapse of SVB was the second-largest bank failure in the history of the United States. The largest, Washington Mutual in 2008, took place over the course of eight months. SVB's collapse played out in barely two days.
Anxious Twitter posts and WhatsApp exchanges, coupled with the ease of access that online banking provides, are seen by analysts as a serious catalyst for the current crisis. Experts suggest that in the social media age, the psychological behaviour behind a bank run - mass fear from depositors of losing their savings - may be amplified and go viral more quickly than bank officers and regulators can successfully respond."
The ensuing controversy has sparked renewed debates about the ways in which algorithms can perpetuate biases, yielding unintended and often offensive results.
"The world's spookiest philosopher is Nick Bostrom, a thin, soft-spoken Swede. Of all the people worried about runaway artificial intelligence, and killer robots, and the possibility of a technological doomsday, Bostrom conjures the most extreme scenarios. In his mind, human extinction could be just the beginning."
""Most of us anganwadi workers don't have enough education to understand these apps," she added. "We don't get enough network in the village to use them, and we don't earn enough to recharge the phone on time. So what is the point?""
"But because of "security issues" that may include concerns that she is part of a wider espionage plot, both Ai-Da and her sculpture were held in Egyptian customs for 10 days before being released on Wednesday, sparking a diplomatic fracas."
"Ege Gürdeniz: There are two components to Artificial Intelligence (AI) bias. The first is an AI application making biased decisions regarding certain groups of people. This could be ethnicity, religion, gender, and so on. To understand that we first need to understand how AI works and how it's trained to complete specific tasks."
"There's no concrete evidence of long-term harm to our children caused by using our phones around them, but there's enough evidence of potential short-term effects that it makes sense to be mindful of it. Some amount of phone use around our kids is probably okay, but if we're absorbed in our devices in a way that interferes with our ability to connect with and respond to them, this can become a problem. Also, let's be kind to ourselves."
""In some ways we've lost agency. When programs pass into code and code passes into algorithms and then algorithms start to create new algorithms, it gets farther and farther from human agency. Software is released into a code universe which no one can fully understand.""
"Big data and artificial intelligence are some of today's most popular buzzwords. Both are promised to help deliver insights that were previously too complex for computer systems to calculate. With examples ranging from personalised recommendation systems to automatic facial analyses, user-generated data is now analysed by algorithms to identify patterns and predict outcomes. And the common view is that these developments will have a positive impact on society."
"Robots, artificial intelligence and smart speakers will ease the burden on doctors and give them more time with patients, according to an NHS report on the pending technological "revolution" in healthcare.
Developments in the ability to sequence individuals' genomes - the entirety of their genetic data - will also spur on advances, according to the review published on Monday."
""We have only examined a tiny fraction of this code base and found a critical, election-stealing issue," said Lewis, who is currently executive director of the Open Privacy Research Society, a Canadian nonprofit that develops secure and privacy-enhancing software for marginalized communities. "Even if this [backdoor] is closed its mere existence raises serious questions about the integrity of the rest of the code.""
"Interestingly, the time reversal algorithm itself could prove useful for making quantum computers more precise. "Our algorithm could be updated and used to test programs written for quantum computers and eliminate noise and errors," Lebedev explained."
"If that data is collected, does the child have a right to get it back? If that data is collected from very early childhood and does not belong to the child, does it make the child extra vulnerable because his or her choices and patterns of behaviour could be known to anyone who purchases the data, for example, companies or political campaigns?
Depending on the privacy laws of the state in which the toys are being used, if the data is collected and kept, it breaches Article 16 of the Convention on the Rights of the Child - the right to privacy. (Though, of course, this is arguably something parents routinely do by posting pictures of their children on Facebook.)"
"Scientists must embrace circumspection, transparency, and robust ways of working that safeguard against bias and analytical flexibility. Doing so will provide parents and policymakers with the reliable insights they need on a topic most often characterized by unfounded media hype."
"Algorithms steer us back to similar content in echo chambers that inhibit both critical and creative thinking. Platforms incentivized to keep users scrolling discourage long-looking and render users as passive consumers, rather than active seekers of inspiration. They aren't a space for productive feedback, either: Art takes on a different tone when it's surrounded by dog GIFs, political memes, and your cousin's baby photos."
"A group of current and former contractors who worked for years at the social network's Berlin-based moderation centres has reported witnessing colleagues become "addicted" to graphic content and hoarding ever more extreme examples for a personal collection. They also said others were pushed towards the far right by the amount of hate speech and fake news they read every day."
"If we assume that these developments continue, and with them our interest in creating simulations of the world, then at some point in the future - 1,000 years, 100,000 years - it's reasonable to assume that the difference between reality and simulation will become indistinguishable. At which point it will mean we will have created simulated beings with their own consciousness.
But if that is the inevitable outcome of continued technological advancement, unless nuclear war or some other catastrophe intervenes, then it's quite possible - some would say an overwhelming certainty - that it's already happened, and we are the ancestor simulations created by an advanced post-human civilisation."